Connecting Belief Propagation with Maximum Likelihood Detection

Authors

  • John MacLaren Walsh
  • Phillip Allan Regalia
Abstract

While its performance in the Gaussian, infinite block length, and loopless factor graph cases is well understood, the breakthrough applications of the belief propagation algorithm to the decoding of turbo and LDPC codes involve finite block lengths, finite alphabets, and factor graphs with loops. It has been shown in these instances that the stationary points of the belief propagation decoder are the critical points of the Bethe approximation to the free energy. However, this connection does not clearly explain why the stationary points of belief propagation yield good performance, since the approximation is not in general exact when there are loops in the graph. Here we introduce an alternate constrained maximum likelihood optimization problem that analytically connects the stationary points of belief propagation with the maximum likelihood sequence detector.
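
As context for the Bethe connection stated in the abstract, the standard variational characterization (due to Yedidia, Freeman, and Weiss) identifies fixed points of belief propagation with stationary points of the constrained Bethe free energy. A minimal sketch of that objective, using standard notation supplied here for illustration rather than taken from this paper, is

\[
F_{\mathrm{Bethe}}\big(\{b_a\},\{b_i\}\big) = \sum_{a}\sum_{\mathbf{x}_a} b_a(\mathbf{x}_a)\,\ln\frac{b_a(\mathbf{x}_a)}{f_a(\mathbf{x}_a)} \;-\; \sum_{i}(d_i-1)\sum_{x_i} b_i(x_i)\,\ln b_i(x_i),
\]

minimized subject to the marginalization constraints \(\sum_{\mathbf{x}_a \setminus x_i} b_a(\mathbf{x}_a) = b_i(x_i)\) and normalization, where \(b_a\) and \(b_i\) are factor and variable beliefs, \(f_a\) the local factors, and \(d_i\) the degree of variable node \(i\). Because this objective only approximates the true free energy when the factor graph has loops, stationarity alone does not certify decoding performance, which is the gap the constrained maximum likelihood formulation of this paper is meant to close.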

Similar articles

Subgraph Detection with cues using Belief Propagation

We consider an Erdős-Rényi graph with n nodes and edge probability q that is embedded with a random subgraph of size K with edge probabilities p such that p > q. We address the problem of detecting the subgraph nodes when only the graph edges are observed, along with some extra knowledge of a small fraction of subgraph nodes, called cued vertices or cues. We employ a local and distributed algor...

Near Maximum Likelihood Decoding with Deep Learning

A novel and efficient neural decoder algorithm is proposed. The proposed decoder is based on the neural Belief Propagation algorithm and the Automorphism Group. By combining neural belief propagation with permutations from the Automorphism Group we achieve near maximum likelihood performance for High Density Parity Check codes. Moreover, the proposed decoder significantly improves the decoding ...

Approximate Expectation Maximization

We discuss the integration of the expectation-maximization (EM) algorithm for maximum likelihood learning of Bayesian networks with belief propagation algorithms for approximate inference. Specifically we propose to combine the outer-loop step of convergent belief propagation algorithms with the M-step of the EM algorithm. This then yields an approximate EM algorithm that is essentially still d...
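
As a brief illustration of the combination described in this snippet, in standard EM notation (not drawn from the paper itself): with observed variables \(v\), hidden variables \(h\), and parameters \(\theta\),

\[
\text{E-step: } q^{(t)}(h) \approx p\big(h \mid v, \theta^{(t)}\big), \qquad
\text{M-step: } \theta^{(t+1)} = \arg\max_{\theta}\; \mathbb{E}_{q^{(t)}}\big[\ln p(v, h \mid \theta)\big],
\]

where the E-step posterior marginals, intractable in a loopy Bayesian network, are replaced by the approximate marginals returned by a (convergent) belief propagation algorithm.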

Adaptive Maximum-Likelihood Decoding Algorithms for Linear Block Codes

This correspondence presents two new soft-decision decoding algorithms that promise to reduce complexity and at the same time achieve maximum likelihood decoding (MLD) performance. The first method is an Adaptive Two-Stage Maximum Likelihood Decoder [1] that first estimates a minimum sufficient set and performs decoding within the smaller set to reduce complexity and at the same time achi...

Solving Non-parametric Inverse Problem in Continuous Markov Random Field using Loopy Belief Propagation

In this paper, we address the inverse problem, or the statistical machine learning problem, in Markov random fields with a non-parametric pair-wise energy function with continuous variables. The inverse problem is formulated by maximum likelihood estimation. The exact treatment of maximum likelihood estimation is intractable because of two problems: (1) it includes the evaluation of the partiti...
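
To make the partition function difficulty concrete, in standard MRF notation supplied here for illustration: writing \(p(x \mid \theta) = \exp\{-E(x;\theta)\}/Z(\theta)\) with \(Z(\theta) = \int \exp\{-E(x;\theta)\}\,dx\) for continuous variables, the log-likelihood gradient is

\[
\nabla_\theta \ln p\big(x^{\mathrm{data}} \mid \theta\big) = -\nabla_\theta E\big(x^{\mathrm{data}};\theta\big) + \mathbb{E}_{p(x \mid \theta)}\big[\nabla_\theta E(x;\theta)\big],
\]

and the second term requires expectations under the current model, the same marginalization that makes \(Z(\theta)\) intractable; loopy belief propagation provides approximate marginals for exactly this step.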

Journal:

Volume   Issue

Pages  -

Publication date: 2005